When Reduced Connectivity Neural Nets

Author

  • Valeriu Beiu
Abstract

The starting points of this paper are two size-optimal solutions: (i) one for implementing arbitrary Boolean functions (Horne & Hush, 1994); and (ii) another one for implementing a particular sub-class of Boolean functions (Red’kin, 1970). Because VLSI implementations do not cope well with highly interconnected nets, the area of a chip growing with the cube of the fan-in (Hammerstrom, 1988), this paper will analyse the influence of limited fan-in on the size optimality of the two solutions mentioned. First, we will extend a result from Horne & Hush (1994), valid for fan-in ∆ = 2, to arbitrary fan-in. Second, we will prove that size-optimal solutions are obtained for small constant fan-ins for both constructions, while relative minimum-size solutions can be obtained for fan-ins strictly lower than linear. These results are in agreement with similar ones proving that for small constant fan-ins (∆ = 6...9) there exist VLSI-optimal (i.e., minimising AT²) solutions (Beiu, 1997c), and with the small constants (5...9) relating to our capacity for processing information (Miller, 1956).
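As a hedged illustration only (built from the Hammerstrom (1988) estimate quoted above, not from any further detail of the paper), the fan-in penalty can be written as

    A(\Delta) \;\propto\; \Delta^{3}   % chip area of one neuron versus its fan-in \Delta (Hammerstrom, 1988)

so reducing the fan-in from linear in the number of inputs (\Delta = n) to a small constant (e.g., \Delta = 8) shrinks the per-neuron area by roughly a factor of (n/8)^3, at the price of extra layers; quantifying this depth-versus-size trade-off is what the size-optimality analysis addresses.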


Similar articles

Solving Fuzzy Equations Using Neural Nets with a New Learning Algorithm

Artificial neural networks have advantages such as learning, adaptation, fault-tolerance, parallelism, and generalization. This paper offers a novel method for finding a solution of a fuzzy equation that is assumed to have a real solution. To this end, we applied an architecture of fuzzy neural networks in which the connection weights are real numbers. The ...

Autoassociative memory retrieval and spontaneous activity bumps in small-world networks of integrate-and-fire neurons.

The metric structure of synaptic connections is obviously an important factor in shaping the properties of neural networks, in particular the capacity to retrieve memories with which autoassociative nets operating via attractor dynamics are endowed. Qualitatively, some real networks in the brain could be characterized as 'small worlds', in the sense that the structure of their connections is i...


Cerebra as Fractal Neural Vector-Machines

To this day, the functioning of central nervous systems remains enigmatic in major respects. Therefore, alternative models of neural nets and their ways of processing information may be of interest. Fractal neural nets, whose connectivity is completely determined by fractal functions, show interesting features such as regions of divergence and convergence and connections between distant regions of the net dire...

Prediction of Gain in LD-CELP Using Hybrid Genetic/PSO-Neural Models

In this paper, the gain in the LD-CELP speech coding algorithm is predicted using three neural models, which are equipped with genetic and particle swarm optimization (PSO) algorithms to optimize the structure and parameters of the neural networks. Elman, multi-layer perceptron (MLP), and fuzzy ARTMAP networks are the candidate neural models. The optimized number of nodes in the first and second hidden layers of El...


Journal title:

Volume   Issue 

Pages  -

Publication date: 1998